
    'Bridging the gap' - A survey of medical GPs' awareness of child dental neglect as a marker of potential systemic child neglect.

    Background: Higher levels of tooth decay are seen in abused and neglected children. The medical general practitioner (GP)/family doctor is often the first point of contact within the UK National Health Service (NHS). Aim: We aimed to assess whether, in the absence of a dentist, GPs are sufficiently trained to identify dental neglect (DN) as a marker of child neglect (CN). Design and setting: A structured survey was sent to all NHS GPs on the Isle of Wight, UK (n = 106). Method: The survey examined GPs' level of awareness and perceptions regarding the importance of dental health care provision in the identification of DN and CN. The level of training GPs had received to identify dental pathology was also assessed. Results: Fifty-five GPs completed the survey (52%). The majority of GPs had never liaised with a dentist, and 50% of GPs believed childhood immunisations were more important than registration with a dentist. Ninety-six percent of GPs had never received any formal dental training, and some did not perceive dental health to be important. Only five GPs mentioned a link between a lack of dental registration and CN, and no GPs worked at clinics where child dental registration status was recorded. Conclusion: In the absence of formal recording, follow-up and compulsory attendance at the dentist, the timely detection of DN and potential CN may be impaired. This study demonstrates that medical GPs are ill-equipped to detect DN, a recognised marker of broader neglect, and may therefore miss an important opportunity to detect CN and improve child health and welfare.

    Absolutely stable proton and lowering the gauge unification scale

    A unified model is constructed, based on flipped SU(5), in which the proton is absolutely stable. The model requires the existence of new leptons with masses of order the weak scale. The possibility that the unification scale could be extremely low is discussed.

    Surface-Atmosphere Coupling Scale, the Fate of Water, and Ecophysiological Function in a Brazilian Forest

    This is the final version, available from the American Geophysical Union (AGU) via the DOI in this record. The K83 observational data are available from AmeriFlux (ameriflux.lbl.gov); NCEP Reanalysis data are provided by NOAA/ESRL/PSD, Boulder, Colorado, USA, from the http://www.cdc.noaa.gov/ website. Model code and output are stored at GitLab (gitlab.com); this project is password protected, and the password can be obtained from the corresponding author at [email protected] upon request. Tropical South America plays a central role in global climate. The Bowen ratio teleconnects to circulation and precipitation processes far afield, and the global CO2 growth rate is strongly influenced by carbon cycle processes in South America. However, quantification of basin-wide seasonality of flux partitioning between latent and sensible heat, the response to anomalies around climatic norms, and understanding of the processes and mechanisms that control the carbon cycle remain elusive. Here, we investigate simulated surface-atmosphere interaction at a single site in Brazil, using models with different representations of precipitation and cloud processes, as well as differences in the scale of coupling between the surface and atmosphere. We find that the model with parameterized clouds/precipitation has a tendency toward unrealistic perpetual light precipitation, while models with explicit treatment of clouds produce more intense and less frequent rain. Models that couple the surface to the atmosphere on the scale of kilometers, as opposed to tens or hundreds of kilometers, produce even more realistic distributions of rainfall. Rainfall intensity has direct consequences for the “fate of water,” or the pathway that a hydrometeor follows once it interacts with the surface. We find that the model with explicit treatment of cloud processes, coupled to the surface at small scales, is the most realistic when compared to observations. These results have implications for simulations of global climate, as the use of models with explicit (as opposed to parameterized) cloud representations becomes more widespread. Funding: National Aeronautics and Space Administration (NASA); National Science Foundation (NSF); U.S. Department of Energy (DOE).
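
    For reference (not part of the study's data), the Bowen ratio mentioned above is the standard ratio of sensible to latent heat flux at the surface,

        B = H / (\lambda E),

    so a seasonal shift in how available energy is partitioned between the two fluxes shows up directly as a shift in B.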

    Estimating parameters for probabilistic linkage of privacy-preserved datasets.

    Background: Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Methods: Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates ranging from zero to 20%. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by the F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm that allows linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Results: Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error. Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded results for the F-measure that were only slightly below the highest possible for those probabilities. Conclusions: The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, the approach is a major step towards providing a complete operational approach for probabilistic linkage of privacy-preserved datasets.
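
    To make the quantities concrete, the following minimal Python sketch (not the authors' code; the field names and m/u probabilities are illustrative assumptions) shows the Fellegi-Sunter style agreement weights that EM-estimated match probabilities feed into, and the F-measure used to score a candidate threshold.

        from math import log2

        def agreement_weight(m: float, u: float) -> float:
            # Log-odds weight a field contributes when it agrees (Fellegi-Sunter).
            return log2(m / u)

        def disagreement_weight(m: float, u: float) -> float:
            # Log-odds weight a field contributes when it disagrees.
            return log2((1 - m) / (1 - u))

        def f_measure(true_pos: int, false_pos: int, false_neg: int) -> float:
            # Harmonic mean of precision and recall, used to compare candidate thresholds.
            precision = true_pos / (true_pos + false_pos)
            recall = true_pos / (true_pos + false_neg)
            return 2 * precision * recall / (precision + recall)

        # Hypothetical m/u probabilities, e.g. as estimated by EM on the encrypted data.
        fields = {"surname": (0.95, 0.01), "dob": (0.98, 0.003)}
        full_agreement_score = sum(agreement_weight(m, u) for m, u in fields.values())
        print(round(full_agreement_score, 2))  # total weight for a pair agreeing on all fields

    A record pair is accepted as a match when its summed weight exceeds the chosen threshold; the EM extension described above estimates the F-measure at each possible threshold so that the cut-off can be picked without access to clear-text data.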

    Supergravity for Effective Theories

    Higher-derivative operators are central elements of any effective field theory. In supersymmetric theories, these operators include terms with derivatives in the Kähler potential. We develop a toolkit for coupling such supersymmetric effective field theories to supergravity. We explain how to write the action for minimal supergravity coupled to chiral superfields with arbitrary numbers of derivatives and curvature couplings. We discuss two examples in detail, showing how the component actions agree with the expectations from the linearized description in terms of a Ferrara-Zumino multiplet. In a companion paper, we apply the formalism to the effective theory of inflation. Comment: 26 pages.
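
    As a schematic illustration drawn from the general supersymmetric EFT literature (not a formula quoted from this paper), the leading higher-derivative operator for a single chiral superfield \Phi can be written, up to numerical factors, as

        \frac{1}{\Lambda^4} \int d^4\theta \; D^\alpha\Phi \, D_\alpha\Phi \; \bar{D}_{\dot\alpha}\Phi^\dagger \, \bar{D}^{\dot\alpha}\Phi^\dagger \;\supset\; \frac{1}{\Lambda^4} \, \partial_\mu\phi \, \partial^\mu\phi \; \partial_\nu\bar\phi \, \partial^\nu\bar\phi ,

    and it is operators of this type, together with curvature couplings, that the toolkit is designed to carry over to supergravity.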

    Lorentz Violation in Warped Extra Dimensions

    Higher dimensional theories which address some of the problematic issues of the Standard Model (SM) naturally involve some form of D=4+n-dimensional Lorentz invariance violation (LIV). In such models the fundamental physics which leads to, e.g., field localization, orbifolding, the existence of brane terms and the compactification process can all introduce LIV in the higher dimensional theory while still preserving 4-d Lorentz invariance. In this paper, attempting to capture some of this physics, we extend our previous analysis of LIV in 5-d UED-type models to those with 5-d warped extra dimensions. To be specific, we employ the 5-d analog of the SM Extension of Kostelecky et al., which incorporates a complete set of operators arising from spontaneous LIV. We show that while the response of the bulk scalar, fermion and gauge fields to the addition of LIV operators in warped models is qualitatively similar to what happens in the flat 5-d UED case, the gravity sector of these models reacts very differently than in flat space. Specifically, we show that LIV in this warped case leads to a non-zero bulk mass for the 5-d graviton, and so the would-be zero mode, which we identify as the usual 4-d graviton, must necessarily become massive. The origin of this mass term is the simultaneous existence of the constant non-zero AdS_5 curvature and the loss of general co-ordinate invariance via LIV in the 5-d theory. Thus warped 5-d models with LIV in the gravity sector are not phenomenologically viable. Comment: 14 pages, 4 figs; discussion added, algebra repaired.
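
    For context, the warped backgrounds considered here are of the Randall-Sundrum form, with metric (in one common convention)

        ds^2 = e^{-2k|y|} \eta_{\mu\nu} dx^\mu dx^\nu + dy^2 ,

    where the scale k is set by the constant AdS_5 curvature; it is the combination of this nonzero curvature with the LIV-induced loss of 5-d general coordinate invariance that forces the would-be graviton zero mode to become massive.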

    Charming CP Violation and Dipole Operators from RS Flavor Anarchy

    Recently the LHCb collaboration reported evidence for direct CP violation in charm decays. The value is sufficiently large that either substantially enhanced Standard Model contributions or non-Standard Model physics is required to explain it. In the latter case only a limited number of possibilities would be consistent with other existing flavor-changing constraints. We show that warped extra dimensional models that explain the quark spectrum through flavor anarchy can naturally give rise to contributions of the size required to explain the LHCb result. The D meson asymmetry arises through a sizable CP-violating contribution to a chromomagnetic dipole operator. This happens naturally without introducing inconsistencies with existing constraints in the up quark sector. We discuss some subtleties in the loop calculation that are similar to those in Higgs to \gamma\gamma. Loop-induced dipole operators in warped scenarios and their composite analogs exhibit non-trivial dependence on the Higgs profile, with the contributions monotonically decreasing when the Higgs is pushed away from the IR brane. We show that the size of the dipole operator quickly saturates as the Higgs profile approaches the IR brane, implying small dependence on the precise details of the Higgs profile when it is quasi IR localized. We also explain why the calculation of the coefficient of the lowest dimension 5D operator is guaranteed to be finite. This is true not only in the charm sector but also for other radiative processes such as electric dipole moments, b to s\gamma, \epsilon'/\epsilon_K and \mu to e\gamma. We furthermore discuss the interpretation of this contribution within the framework of partial compositeness in four dimensions and highlight some qualitative differences between the generic result of composite models and that obtained for dynamics that reproduces the warped scenario. Comment: 14 pages.
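
    For orientation, the chromomagnetic dipole operator referred to above has the schematic form (normalisations and chirality conventions vary between references)

        Q_{8g} \sim \frac{g_s m_c}{8\pi^2} \, \bar{u} \, \sigma_{\mu\nu} T^a (1 + \gamma_5) c \, G^{a\,\mu\nu} ,

    and a sizable CP-violating phase in its Wilson coefficient is what feeds the direct CP asymmetry observed in D meson decays.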

    Evaluating privacy-preserving record linkage using cryptographic long-term keys and multibit trees on large medical datasets.

    Background: Integrating medical data from different sources by record linkage is a powerful technique increasingly used in medical research. In many jurisdictions, unique personal identifiers needed for linking the records are unavailable. Since sensitive attributes, such as names, have to be used instead, privacy regulations usually demand encrypting these identifiers. The corresponding set of techniques for privacy-preserving record linkage (PPRL) has received widespread attention. One recent method is based on Bloom filters. Due to their superior resilience against cryptographic attacks, composite Bloom filters (cryptographic long-term keys, CLKs) are considered best practice for privacy in PPRL. The real-world performance of these techniques on large-scale data has so far been unknown. Methods: Using a large subset of Australian hospital admission data, we tested the performance of an innovative PPRL technique (CLKs using multibit trees) against a gold standard derived from clear-text probabilistic record linkage. Linkage time and linkage quality (recall, precision and F-measure) were evaluated. Results: Clear-text probabilistic linkage resulted in marginally higher precision and recall than CLKs. PPRL required more computing time, but 5 million records could still be de-duplicated within one day. However, the PPRL approach required fine tuning of parameters. Conclusions: We argue that the increased privacy of PPRL comes at the price of small losses in precision and recall and a large increase in computational burden and setup time. These costs seem to be acceptable in most applied settings, but they have to be considered in the decision to apply PPRL. Further research on the optimal automatic choice of parameters is needed.
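
    For orientation only, here is a minimal Python sketch of the kind of composite Bloom filter (CLK) encoding described above, in which every identifier field is hashed into one shared bit array and encoded records are compared with a set-based similarity. The field names, filter length, number of hash functions and use of plain SHA-256 are illustrative assumptions, not the evaluated implementation (which would also use secret, keyed hashing).

        import hashlib

        def bigrams(value: str) -> set[str]:
            # Split a padded identifier into character bigrams.
            padded = f"_{value.lower()}_"
            return {padded[i:i + 2] for i in range(len(padded) - 1)}

        def clk_encode(record: dict[str, str], length: int = 1000, num_hashes: int = 20) -> set[int]:
            # Hash every bigram of every field into one composite Bloom filter,
            # represented here as the set of bit positions switched on.
            bits: set[int] = set()
            for field, value in record.items():
                for gram in bigrams(value):
                    for seed in range(num_hashes):
                        digest = hashlib.sha256(f"{seed}|{field}|{gram}".encode()).hexdigest()
                        bits.add(int(digest, 16) % length)
            return bits

        def dice_similarity(a: set[int], b: set[int]) -> float:
            # Dice coefficient between two encoded records (1.0 means identical bit patterns).
            return 2 * len(a & b) / (len(a) + len(b))

        rec1 = {"name": "Jane Smith", "dob": "1980-01-02"}
        rec2 = {"name": "Jane Smyth", "dob": "1980-01-02"}
        print(round(dice_similarity(clk_encode(rec1), clk_encode(rec2)), 3))

    Because similar clear-text values share bigrams, their encodings share bit positions, which is what allows approximate matching on the encrypted identifiers without revealing them.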

    Smeared versus localised sources in flux compactifications

    We investigate whether vacuum solutions in flux compactifications that are obtained with smeared sources (orientifolds or D-branes) still survive when the sources are localised. This seems to rely on whether the solutions are BPS or not. First we consider two sets of BPS solutions that both relate to the GKP solution through T-dualities: (p+1)-dimensional solutions from spacetime-filling Op-planes with a conformally Ricci-flat internal space, and p-dimensional solutions with Op-planes that wrap a 1-cycle inside an everywhere negatively curved twisted torus. The relation between the solution with smeared orientifolds and the localised version is worked out in detail. We then demonstrate that a class of non-BPS AdS_4 solutions that exist for IASD fluxes with smeared D3-branes (or analogously for ISD fluxes with anti-D3-branes) does not survive the localisation of the (anti) D3-branes. This casts doubt on the stringy consistency of non-BPS solutions that are obtained in the limit of smeared sources. Comment: 23 pages; v2: minor corrections, added references, version published in JHEP.
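
    For reference, in GKP-type compactifications the BPS condition on the complexified three-form flux G_3 = F_3 - \tau H_3 is that it be imaginary self-dual (ISD),

        \star_6 G_3 = i \, G_3 ,

    while IASD fluxes obey \star_6 G_3 = -i \, G_3; the non-BPS AdS_4 solutions above involve IASD fluxes with D3-branes (or, analogously, ISD fluxes with anti-D3-branes).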

    Impact of exposure of methicillin-resistant Staphylococcus aureus to polyhexanide in vitro and in vivo.

    Methicillin-resistant Staphylococcus aureus (MRSA) strains resistant to decolonization agents such as mupirocin and chlorhexidine increase the need to develop alternative decolonization molecules. The absence of reported adverse reactions and of bacterial resistance to polyhexanide makes it an excellent choice as a topical antiseptic. In the present study we evaluated the capacity of polyhexanide exposure, in vitro and in vivo, to generate strains with reduced susceptibility and cross-resistance to chlorhexidine and/or antibiotics currently used in the clinic. Here we report the in vitro emergence of reduced susceptibility to polyhexanide by prolonged stepwise exposure to low concentrations in broth culture. Reduced susceptibility to polyhexanide was associated with genomic changes in the mprF and purR genes, and with concomitant decreased susceptibility to daptomycin and other cell-wall-active antibiotics. However, the in vitro emergence of reduced susceptibility to polyhexanide did not result in cross-resistance to the chlorhexidine antiseptic. During in vivo polyhexanide clinical decolonization treatment, neither reduced susceptibility to polyhexanide nor cross-resistance to chlorhexidine was observed. Together, these observations suggest that polyhexanide could be used safely for decolonization of carriers of chlorhexidine-resistant S. aureus strains, but they highlight the need for careful use of polyhexanide at low antiseptic concentrations.